# Monolingual BERT

## Hplt Bert Base Sk
- License: Apache-2.0
- Description: A monolingual Slovak BERT model released by the HPLT project, built on the LTG-BERT architecture and suited to masked language modeling tasks.
- Tags: Large Language Model, Transformers, Other
- Organization: HPLT
- Downloads: 23 · Likes: 2
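
Since this entry is pitched at masked language modeling, a minimal usage sketch may help. It assumes the checkpoint is published on the Hugging Face Hub under an ID such as `HPLT/hplt_bert_base_sk` and that the LTG-BERT architecture ships custom modeling code, which is why `trust_remote_code=True` is passed; both details should be checked against the model card.

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="HPLT/hplt_bert_base_sk",  # assumed Hub ID; verify on the model card
    trust_remote_code=True,          # LTG-BERT ships custom model classes
)

# Rank completions for the masked token in a Slovak sentence.
for prediction in fill_mask("Bratislava je hlavné mesto [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```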
## ElhBERTeu
- Description: ElhBERTeu is a BERT model developed for the Basque language, trained on multi-domain corpora and showing strong performance on the BasqueGLUE benchmark.
- Tags: Large Language Model, Transformers, Other
- Organization: orai-nlp
- Downloads: 529 · Likes: 2
## Bert Fa Zwnj Base
- License: Apache-2.0
- Description: A Persian language understanding model based on the Transformer architecture, able to handle zero-width non-joiner (ZWNJ) issues in Persian writing.
- Tags: Large Language Model, Other
- Organization: HooshvareLab
- Downloads: 5,590 · Likes: 15
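
Because this entry centers on ZWNJ handling, a short tokenization sketch may illustrate what that means in practice. It assumes the checkpoint is available on the Hugging Face Hub as `HooshvareLab/bert-fa-zwnj-base` and that its tokenizer preserves the zero-width non-joiner (U+200C) rather than silently dropping it; both points are assumptions to verify against the model card.

```python
from transformers import AutoTokenizer

# Assumed Hub ID; confirm against the actual model card.
tokenizer = AutoTokenizer.from_pretrained("HooshvareLab/bert-fa-zwnj-base")

# "می‌روم" ("I go") joins the prefix "می" to the stem "روم" with a ZWNJ (U+200C).
text = "من به مدرسه می\u200cروم."
print(tokenizer.tokenize(text))
```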
## Bert Base Irish Cased V1
- Description: gaBERT is a BERT-based monolingual Irish model trained on 7.9 million Irish sentences, suitable for fine-tuning on downstream Irish tasks.
- Tags: Large Language Model, Transformers
- Organization: DCU-NLP
- Downloads: 42 · Likes: 5
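
Since this entry is aimed at fine-tuning on downstream Irish tasks, a compact fine-tuning sketch follows. It assumes the checkpoint lives on the Hugging Face Hub as `DCU-NLP/bert-base-irish-cased-v1` and that a labelled Irish CSV with `text` and `label` columns exists locally; the Hub ID, file name, and label count are placeholders rather than details from the listing.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_id = "DCU-NLP/bert-base-irish-cased-v1"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Hypothetical CSV of Irish sentences with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "irish_train.csv"})

def tokenize(batch):
    # Tokenize to fixed-length inputs so the default data collator can batch them.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gabert-finetuned", num_train_epochs=3),
    train_dataset=dataset["train"],
)
trainer.train()
```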